confidence for the inference. For instance, if a novel data point is in the upper-left subspace shown in Figure 3.37(b), this novel data point is then labelled, or predicted, as a new member of the cross class with almost 100% confidence. This is because all data points in this subspace belong to the cross class. Of course, if a novel data point is in the upper subspace generated using the partitioning rule y = 0 in Figure 3.37(a), the confidence of labelling this data point as the cross class is lower, because this subspace is far less pure with respect to the cross class.
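This purity-based confidence can be made concrete with a short sketch. The rule y = 0 comes from the passage; the training points, the label names, and the second rule x = 0 carving out the upper-left subspace of Figure 3.37(b) are illustrative assumptions, since the figure's data are not reproduced here.

```python
import numpy as np

# Illustrative stand-in for the training data in Figure 3.37 (assumed values):
# 2-D points labelled "cross" or "circle".
points = np.array([[-1.5, 1.0], [-0.8, 0.6], [-1.2, 1.4],
                   [1.0, -0.7], [0.9, 0.3], [1.4, -1.1]])
labels = np.array(["cross", "cross", "cross", "circle", "circle", "circle"])

def subspace_confidence(points, labels, in_subspace, target_class):
    """Purity of a subspace with respect to target_class: the fraction of
    training points falling inside the subspace that carry that label."""
    mask = np.array([in_subspace(p) for p in points])
    members = labels[mask]
    return float(np.mean(members == target_class)) if members.size else 0.0

# The upper subspace from the single rule y = 0, as in Figure 3.37(a).
upper = lambda p: p[1] > 0
# The upper-left subspace of Figure 3.37(b); the rule x = 0 is an assumption,
# as the exact thresholds in the figure are not given in the text.
upper_left = lambda p: p[0] < 0 and p[1] > 0

print(subspace_confidence(points, labels, upper, "cross"))       # 0.75, less pure
print(subspace_confidence(points, labels, upper_left, "cross"))  # 1.0, pure
```

With this toy data the single-rule subspace yields a lower confidence (0.75) than the pure upper-left subspace (1.0), mirroring the contrast drawn above.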
Once a partitioning strategy for a data space has been derived, how can the partitioned space be expressed or visualised as a decision-making process for interpreting every decision made using the model? The plots in Figure 3.37 are certainly not efficient for this purpose. In order to make the intelligence of a model interpretable to humans, a tree-like decision-making structure has been adopted by inductive learning algorithms. With such a tree, how a decision is made can be clearly visualised step by step. Importantly, the decision-making process is presented sequentially. For instance, such a tree can show which variable is employed first and which variables are employed in subsequent steps to explain why and how a decision is made. Finally, such a tree can show how confident it is in a decision that has been made. In short, inductive learning can help deliver a model with human-like intelligence.
(Figure) The decision-making trees for the partitioning rules derived for the data shown in Figure 3.37. A diamond represents a decision-making process using a partitioning rule; a leaf node represents a made decision. (a) The tree model constructed for the data shown in Figure 3.37(a). (b) The tree model constructed for the data shown in Figure 3.37(b).
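A tree like those in the figure can be traced as a chain of rule tests, each step naming the variable it uses and each leaf reporting its subspace purity as the confidence. The following minimal sketch assumes a two-level tree with the rules y = 0 and x = 0; only the y = 0 rule and the pure upper-left subspace come from the passage, and the other leaf purities are illustrative assumptions.

```python
def decide(x, y):
    """Trace a decision for a novel point through a two-level tree that
    mirrors Figure 3.37(b). The rule y = 0 comes from the text; the rule
    x = 0 and the purities of the non-pure leaves are assumed values."""
    if y > 0:                     # step 1: partitioning rule y = 0
        if x < 0:                 # step 2 (assumed rule): x = 0
            return "cross", 1.00  # upper-left subspace: all crosses
        return "circle", 0.75     # assumed purity of the upper-right subspace
    return "circle", 0.90         # assumed purity of the lower subspace

label, confidence = decide(-0.8, 0.6)
print(label, confidence)          # cross 1.0
```

Reading the nested tests top to bottom reproduces exactly the step-by-step, variable-by-variable explanation that the tree diagrams convey, with the leaf value stating how confident the model is in the decision.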